
    Auto-tuning Distributed Stream Processing Systems using Reinforcement Learning

    Fine-tuning distributed systems is considered a craft, relying on intuition and experience. This becomes even more challenging when systems need to react in near real time, as streaming engines must to maintain pre-agreed service quality metrics. In this article, we present an automated approach that builds on a combination of supervised and reinforcement learning methods to recommend the most appropriate lever configurations based on previous load. With this, streaming engines can be tuned automatically, without requiring a human to determine the right configuration and the proper time to deploy it. This opens the door to configurations that are not applied today, since the complexity of managing these systems has surpassed the abilities of human experts. We show how reinforcement learning systems can find substantially better configurations in less time than their human counterparts and adapt to changing workloads.
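The abstract does not include code; as a hedged illustration of the idea, a minimal tabular Q-learning loop that learns to recommend a lever configuration for an observed load level might look like the sketch below. The load levels, configurations, and reward model are illustrative assumptions, not the authors' actual system.

```python
import random
from collections import defaultdict

LOADS = ["low", "medium", "high"]          # discretised workload states
CONFIGS = ["small", "balanced", "large"]   # candidate lever configurations

def reward(load, config):
    # Toy reward: a configuration sized to the load scores best.
    return 1.0 if LOADS.index(load) == CONFIGS.index(config) else -1.0

q = defaultdict(float)   # Q[(load, config)] -> estimated value
alpha, epsilon = 0.5, 0.2
random.seed(0)

for _ in range(2000):
    load = random.choice(LOADS)
    if random.random() < epsilon:                       # explore
        config = random.choice(CONFIGS)
    else:                                               # exploit current estimate
        config = max(CONFIGS, key=lambda c: q[(load, c)])
    r = reward(load, config)
    q[(load, config)] += alpha * (r - q[(load, config)])

# Recommended configuration per load level after training
best = {load: max(CONFIGS, key=lambda c: q[(load, c)]) for load in LOADS}
print(best)
```

Because the toy reward is deterministic, each Q-value converges to the true reward and the learned policy matches load to capacity; a real tuner would observe rewards from service-quality metrics instead.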

    Non-Linear Multiple Field Interactions Neural Document Ranking

    Ranking tasks are usually based on the text of the main body of the page and the actions (clicks) of users on the page. There are other elements that could be leveraged to better contextualise the ranking experience (e.g. text in other fields, the query made by the user, images, etc.). We present one of the first in-depth analyses of field interaction for multiple-field ranking on two separate datasets. While some works have taken advantage of the full document structure, some aspects remain unexplored. In this work we build on previous analyses to show how query-field interactions, non-linear field interactions, and the architecture of the underlying neural model affect performance.
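To make the notion of non-linear field interaction concrete, here is a hedged toy sketch: each document field gets its own query-match score, and a multiplicative term lets field combinations contribute beyond a weighted sum. The fields, weights, and scoring function are illustrative assumptions, not the model evaluated in the paper.

```python
import math

def field_score(query_terms, field_text):
    # Fraction of query terms that appear in the field.
    tokens = set(field_text.lower().split())
    return sum(t in tokens for t in query_terms) / len(query_terms)

def rank_score(query, doc):
    terms = query.lower().split()
    scores = {f: field_score(terms, text) for f, text in doc.items()}
    # Linear combination of per-field scores...
    linear = 0.5 * scores["title"] + 0.3 * scores["body"] + 0.2 * scores["anchor"]
    # ...plus a non-linear interaction: title AND body matching together
    # is worth more than either alone.
    interaction = scores["title"] * scores["body"]
    return 1 / (1 + math.exp(-(linear + interaction)))  # squash to (0, 1)

doc = {"title": "stream processing tuning",
       "body": "auto tuning of stream engines with learning",
       "anchor": "tuning guide"}
print(round(rank_score("stream tuning", doc), 3))
```

A neural ranker replaces the hand-set weights and the product term with learned non-linear layers, but the structural point is the same: interactions between fields carry signal a purely linear combination misses.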

    Disaggregated Memory at the Edge

    This paper describes how to augment techniques such as Distributed Shared Memory with recent trends in disaggregated Non-Volatile Memory in the data centre, so that the combination can be used in an edge environment with potentially volatile and mobile resources. The article identifies the main advantages and challenges, and offers an architectural evolution to incorporate recent research trends into production-ready disaggregated edges. We also present two prototypes showing the feasibility of this proposal.

    Performance evaluation of SiPM detectors for PET imaging in the presence of magnetic fields

    Proceeding of: 2008 IEEE Nuclear Science Symposium Conference Record (NSS '08), Dresden, Germany, 19-25 Oct. 2008. The multi-pixel photon counter (MPPC), or silicon photomultiplier (SiPM), recently introduced as a solid-state photodetector, consists of an array of Geiger-mode photodiodes (microcells). It is a promising device for PET thanks to its potential for high photon detection efficiency (PDE) and immunity to high magnetic fields. It is also very easy to use, with simple electronic read-out, high gain, and small size. In this work we evaluate the performance of three 1 x 1 mm2 SiPMs and one 6 x 6 mm2 (2 x 2 array) SiPM offered by Hamamatsu for use in PET. We examine the dependence of the energy resolution and the gain of these devices on temperature and reverse bias when coupled to LYSO scintillator crystals. We find that the 400- and 1600-microcell models and the 2 x 2 array are suitable for small crystals, like those employed in high-resolution small-animal scanners. The good performance of these devices up to 7 Tesla has also been confirmed. This work was supported in part by the MEC (FPA2007-07393), CDTEAM (CENIT-Ingenio 2010) Ministerio de Industria, Spain, UCM (Grupos UCM: 910059), CPAN (Consolider-Ingenio 2010) CSPD-2007-00042 projects, and the RECAVA-RETIC network.
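The gain dependence on reverse bias mentioned above is commonly modelled as G = C · (V_bias − V_bd) / e: the charge released per Geiger discharge divided by the electron charge. A hedged sketch of that relation follows; the microcell capacitance and breakdown voltage used in the example are illustrative values, not Hamamatsu's measured device parameters.

```python
E_CHARGE = 1.602e-19  # electron charge [C]

def sipm_gain(c_microcell_farad, v_bias, v_breakdown):
    """Single-microcell gain from the overvoltage V_bias - V_bd."""
    overvoltage = v_bias - v_breakdown
    if overvoltage <= 0:
        return 0.0  # below breakdown: no Geiger multiplication
    return c_microcell_farad * overvoltage / E_CHARGE

# Example: a ~90 fF microcell at 1.5 V overvoltage gives a gain of order 10^5,
# consistent with the "high gain" behaviour noted in the abstract.
print(f"{sipm_gain(90e-15, 71.5, 70.0):.2e}")
```

The linear dependence on overvoltage is why both temperature (which shifts V_bd) and reverse bias must be characterised together, as the study does.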

    Research challenges in nextgen service orchestration

    Fog/edge computing, function as a service, and programmable infrastructures, like software-defined networking or network function virtualisation, are becoming ubiquitous in modern Information Technology infrastructures. These technologies change the characteristics and capabilities of the underlying computational substrate where services run (e.g. higher volatility, scarcer computational power, or programmability). As a consequence, the nature of the services that can run on them changes too (smaller codebases, more fragmented state, etc.). These changes bring new requirements for service orchestrators, which need to evolve so as to support new scenarios where a close interaction between service and infrastructure becomes essential to deliver a seamless user experience. Here, we present the challenges brought forward by this new breed of technologies and where current orchestration techniques stand with regard to them. We also present a set of promising technologies that can help tame this brave new world.

    Venographic comparison of subcutaneous low-molecular weight heparin with oral anticoagulant therapy in the long-term treatment of deep venous thrombosis

    Purpose: The primary objective of this study was to evaluate with venography the rate of thrombus regression after a fixed dose of low-molecular-weight heparin (LMWH) per day for 3 months compared with oral anticoagulant therapy for deep venous thrombosis (DVT). Secondary endpoints were comparisons of the efficacy and safety of both treatments. Methods: This study was designed as an open randomized clinical study in a university hospital setting. Of the 165 patients finally enrolled in the study, 85 were assigned LMWH therapy and 80 were assigned oral anticoagulant therapy. In the group randomized to oral anticoagulant therapy, the patients first underwent treatment in the hospital with standard unfractionated heparin and then coumarin for 3 months. Doses were adjusted with laboratory monitoring to maintain the international normalized ratio between 2.0 and 3.0. Patients in the LMWH group were administered subcutaneous injections of fixed doses of 40 mg enoxaparin (4000 anti-Xa units) every 12 hours for 7 days, and after discharge from the hospital, they were administered 40 mg enoxaparin once daily at fixed doses for 3 months without a laboratory control assay. A quantitative venographic score (Marder score) was used to assess the extent of the venous thrombosis, with 0 points indicating no DVT and 40 points indicating total occlusion of all deep veins. The rate of thrombus reduction was defined as the difference between the quantitative venographic scores after termination of LMWH or coumarin therapy and the scores obtained on the initial venograms. Efficacy was defined as the ability to prevent symptomatic extension or recurrence of venous thromboembolism (documented with venograms or serial lung scans). Safety was defined as the occurrence of hemorrhages.
Results: After 3 months of treatment, the mean Marder score was significantly decreased in both groups in comparison with the baseline score, although the effect of therapy was significantly better after LMWH therapy (49.4% reduction) than after coumarin therapy (24.5% reduction; P < .001). LMWH therapy and male gender were independently associated with an enhanced resolution of the thrombus. A lower frequency of symptomatic recurrent venous thromboembolism was also shown in patients who underwent treatment with LMWH therapy (9.5%) than with oral anticoagulant therapy (23.7%; P < .05), although this difference was entirely a result of recurrence of DVT. Bleeding complications were significantly fewer in the LMWH group than in the coumarin group (1.1% vs 10%; P < .05); this difference was caused by minor hemorrhages. Coumarin therapy and cancer were independently associated with an enhanced risk of complications. Subcutaneous heparin therapy was well tolerated by all patients. Conclusion: The patients who were allocated to undergo enoxaparin therapy had a significantly greater improvement in their quantitative venographic score.
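The percent reductions reported above are relative changes in the Marder score from baseline to end of treatment. As a hedged sketch of that arithmetic (the scores below are illustrative, not patient data):

```python
def percent_reduction(baseline_score, final_score):
    """Relative reduction of the Marder score (0 = no DVT, 40 = total occlusion)."""
    return 100.0 * (baseline_score - final_score) / baseline_score

# e.g. a baseline score of 20.0 falling to 10.12 corresponds to the ~49.4%
# reduction reported for the LMWH arm
print(round(percent_reduction(20.0, 10.12), 1))
```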

    Oral Consumption of Bread from an RNAi Wheat Line with Strongly Silenced Gliadins Elicits No Immunogenic Response in a Pilot Study with Celiac Disease Patients

    Celiac disease (CD) is a genetically predisposed, T cell-mediated, autoimmune-like disorder caused by dietary exposure to the storage proteins of wheat and related cereals. A gluten-free diet (GFD) is the only treatment available for CD. The celiac immune response mediated by CD4+ T cells can be assessed with a short-term oral gluten challenge. This study aimed to determine whether the consumption of bread made using flour from a low-gluten RNAi wheat line (named E82) can activate the immune response in DQ2.5-positive patients with CD after a blind crossover challenge. The experimental protocol included assessing IFN-γ production by peripheral blood mononuclear cells (PBMCs), evaluating gastrointestinal symptoms, and measuring gluten immunogenic peptides (GIP) in stool samples. The response of PBMCs to gliadin and the 33-mer peptide was not significant after E82 bread consumption; in contrast, PBMCs reacted significantly to standard bread. This lack of immune response correlates with the finding that, after E82 bread consumption, stool samples from patients with CD showed very low levels of GIP, and symptoms were comparable to those on the GFD. This pilot study provides evidence that bread made from RNAi E82 flour does not elicit an immune response after a short-term oral challenge and could help manage the GFD in patients with CD. This research was funded by the Spanish Ministry of Science and Innovation, Agencia Estatal de Investigación (Project PID2019-110847RB-I00), Consejería de Transformación Económica, Industria, Conocimiento y Universidades, Junta de Andalucía (Project P20_01005), and "ERDF A way of making Europe", by the European Union.